fix: update robots.txt and sitemap.xml for improved crawler management #8
Type of Change
Please select the type of change (uncomment one):
✅ Bug Fix
✅ Documentation
Summary
Briefly describe what this PR changes:
Adds/validates a Google Search Console–friendly sitemap and updates robots.txt to match our current site structure and crawling goals (public pages allowed, internal files restricted). Also removes Crawl-delay from the Googlebot section to avoid Search Console warnings while still rate-limiting other bots.
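For context, a minimal sketch of the robots.txt layout this describes. The domain and disallowed paths below are placeholders, not the site's actual values. Note that Google ignores Crawl-delay entirely, which is why Search Console flags it:

```
# Googlebot: no Crawl-delay (Google ignores the directive and
# Search Console warns about it); public pages allowed.
User-agent: Googlebot
Allow: /
Disallow: /internal/        # placeholder path for internal files

# All other crawlers: same access rules, but rate-limited.
User-agent: *
Allow: /
Disallow: /internal/
Crawl-delay: 10             # illustrative value

# Point crawlers at the sitemap (placeholder domain).
Sitemap: https://example.com/sitemap.xml
```

If the site uses the jekyll-sitemap plugin, sitemap.xml is generated at build time and only needs to be referenced here; a hand-maintained sitemap works the same way for Search Console.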
Changes Made
List specific files or sections changed:
- robots.txt: allow public pages, restrict internal files, and drop Crawl-delay from the Googlebot section (kept for other bots)
- sitemap.xml: added/validated for Google Search Console
Testing Checklist
Please confirm you've tested the following:
✅ bundle exec jekyll serve locally without errors
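A quick way to sanity-check both files locally, assuming Jekyll's default port (4000):

```sh
# Build and serve the site locally
bundle exec jekyll serve

# In a second terminal, confirm both files are served and well-formed
curl -s http://localhost:4000/robots.txt
curl -s http://localhost:4000/sitemap.xml | head -n 20
```

Screenshots (if applicable)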
Related Issues
Additional Notes
By submitting this PR, I confirm: